Support catalog for Spark Snowflake #432
base: master
Conversation
@luoyedeyi Thanks for your contribution to the Spark connector. I left some comments on this PR. Could you please take a look?
.github/workflows/snyk-issue.yml (outdated)
@@ -0,0 +1,29 @@
name: Snyk Issue
This file does not seem related to this PR.
OK
.github/workflows/snyk-pr.yml (outdated)
@@ -0,0 +1,31 @@
name: snyk-pr
This file does not seem related to this PR.
OK
import scala.collection.convert.ImplicitConversions.`map AsScala`
import scala.collection.mutable.ArrayBuilder

class SfCatalog extends TableCatalog with Logging with SupportsNamespaces{
Could you please check the Scala style with sbt scalastyle?
OK
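For reference, the likely scalastyle complaint on the quoted line is the missing space before the opening brace; a minimal sketch of the reformatted declaration, assuming a typical scalastyle configuration:

```scala
// Sketch only: the same declaration with a space before the opening brace,
// which is what a standard scalastyle whitespace rule expects.
class SfCatalog extends TableCatalog with Logging with SupportsNamespaces {
  // ... existing catalog implementation ...
}
```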
import scala.collection.convert.ImplicitConversions.`map AsScala`
import scala.collection.mutable.ArrayBuilder

class SfCatalog extends TableCatalog with Logging with SupportsNamespaces{
It looks like SfCatalog will work as a utility class for Spark Connector users, correct?
Can you add some unit tests and integration tests for it?
Sure.
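For illustration, a minimal sketch of what a unit test for SfCatalog could look like, assuming scalatest is on the test classpath; the option key and the expectation that initialize stores the catalog name are assumptions, not confirmed behavior from this PR:

```scala
import org.scalatest.funsuite.AnyFunSuite
import org.apache.spark.sql.util.CaseInsensitiveStringMap

class SfCatalogSuite extends AnyFunSuite {
  test("initialize stores the catalog name") {
    val catalog = new SfCatalog
    val options = new java.util.HashMap[String, String]()
    // Hypothetical connection option; a real test would use test credentials.
    options.put("sfurl", "account.snowflakecomputing.com")
    catalog.initialize("snowflake", new CaseInsensitiveStringMap(options))
    // CatalogPlugin.name() is expected to echo the name passed to initialize().
    assert(catalog.name() == "snowflake")
  }
}
```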
    ident: Identifier,
    schema: StructType,
    partitions: Array[Transform],
    properties: java.util.Map[String, String]): Table = ???
??? sounds misleading. If you think the interface functions are not applicable for the Spark Connector, I suggest you throw an UnsupportedOperationException.
This comment applies to the other similar cases.
OK
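Applied to the createTable snippet quoted above, the suggested change would look roughly like this (a sketch; the exception message is illustrative):

```scala
override def createTable(
    ident: Identifier,
    schema: StructType,
    partitions: Array[Transform],
    properties: java.util.Map[String, String]): Table =
  throw new UnsupportedOperationException(
    "SfCatalog does not support createTable")
```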
import org.apache.spark.sql.types.StructType
import org.apache.spark.sql.{Row, SQLContext}

case class SfScan(
Can you explain how SfScan/SfScanBuilder/SfWriteBuilder/SfTable are used?
They are defined in this PR, but I don't understand when and how they are used.
Could you please add some test cases for them?
Sure. I just want to implement some data source interfaces that already exist in Spark, to enable reading and writing data on Snowflake through the catalog.
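For context, a hedged sketch of how a Spark 3.x catalog plugin like this is typically registered and used; the fully qualified class name and option keys below are illustrative assumptions, not taken from this PR:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  // Register the catalog plugin under the name "snowflake" (class name assumed).
  .config("spark.sql.catalog.snowflake",
    "net.snowflake.spark.snowflake.catalog.SfCatalog")
  // Per-catalog options are passed as spark.sql.catalog.<name>.<key> (keys assumed).
  .config("spark.sql.catalog.snowflake.sfurl", "account.snowflakecomputing.com")
  .getOrCreate()

// A read then resolves through SfCatalog -> SfTable -> SfScanBuilder -> SfScan;
// a write goes through SfTable -> SfWriteBuilder.
spark.sql("SELECT * FROM snowflake.my_schema.my_table").show()
```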
      writer.save(data.sqlContext, data, saveMode, params)
    }
  }
}
Add an empty line at the end of the file.
OK
Conflicts:
  .github/workflows/snyk-issue.yml
  .github/workflows/snyk-pr.yml
Force-pushed from 1cd52d0 to be8d050
I refactored the code; please review it, @sfc-gh-mrui.
No description provided.